Web Survey Bibliography
In the last two decades, Web or Internet surveys have had a profound impact on the survey world. The change has been felt most strongly in the market research sector, where many companies have switched from telephone surveys or other modes of data collection to online surveys. The academic and public policy/social attitude sectors were slower to adopt, being more careful about evaluating the effect of the change on key surveys and trends, and conducting research on how best to design and implement Web surveys. The public sector (i.e., government statistical offices) has been the slowest to embrace Web surveys, in part because the stakes are much higher, both in terms of the precision requirements of the estimates and in terms of the public scrutiny of such data. However, National Statistical Offices (NSOs) are heavily engaged in research and development on Web surveys, most notably as part of a mixed-mode data collection strategy, or in the establishment survey world, where repeated measurement and quick turnaround are the norm.

Along with the uneven progress in the adoption of Web surveys have come a number of concerns about the method, particularly with regard to its representational or inferential aspects. At the same time, a great deal of research has been conducted on the measurement side of Web surveys, developing ways to improve the quality of data collected using this medium. This seminar focuses on these two key elements of Web surveys: inferential issues and measurement issues. Each of these broad areas is covered in turn in the following sections. The inferential section is largely concerned with methods of sampling for Web surveys and the associated coverage and nonresponse issues. Different ways in which samples are drawn, using both non-probability and probability-based approaches, are discussed.
The assumptions behind the different approaches to inference in Web surveys, the benefits and risks inherent in each, and the appropriate use of particular approaches to sample selection in Web surveys are reviewed. The following section then addresses a variety of issues related to the design of Web survey instruments, with a review of the empirical literature and practical recommendations for design that minimizes measurement error.
A total survey error framework (see Deming, 1944; Kish, 1965; Groves, 1989) is useful for evaluating the quality or value of a method of data collection such as Web or Internet surveys. In this framework, there are several different sources of error in surveys, and these can be divided into two main groups: errors of non-observation and errors of observation. Errors of non-observation refer to failures to observe or measure eligible members of the population of interest, and include coverage errors, sampling errors, and nonresponse errors. Errors of non-observation are primarily concerned with issues of selection bias. Errors of observation are also called measurement errors (see Biemer et al., 1991; Lessler and Kalsbeek, 1992). Sources of measurement error include the respondent, the instrument, the mode of data collection and (in interviewer-administered surveys) the interviewer. In addition, processing errors can affect all types of surveys. Errors can also be classified according to whether they affect the variance or the bias of survey estimates, both contributing to the overall mean square error (MSE) of a survey statistic. A total survey error perspective aims to minimize the mean square error for a set of survey statistics, given a set of resources. Thus, cost and time are also important elements in evaluating the quality of a survey. While Web surveys are generally much less expensive than other modes of data collection, and are quicker to conduct, serious concerns have been raised about errors of non-observation or selection bias. On the other hand, there is growing evidence that using Web surveys can improve the quality of the data collected (i.e., reduce measurement errors) relative to other modes, depending on how the instruments are designed. Given this framework, we first discuss errors of non-observation or selection bias that may raise concerns about the inferential value of Web surveys, particularly those targeted at the general population.
Then in the second part we discuss ways that the design of the Web survey instrument can affect measurement errors.
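The mean square error mentioned above decomposes into a variance term and a squared bias term; this is the standard textbook formulation (see, e.g., Groves, 1989) rather than a formula specific to any one source cited here. For an estimator \(\hat{\theta}\) of a population quantity \(\theta\):

```latex
\operatorname{MSE}(\hat{\theta})
  = \mathbb{E}\bigl[(\hat{\theta}-\theta)^{2}\bigr]
  = \operatorname{Var}(\hat{\theta})
  + \bigl[\operatorname{Bias}(\hat{\theta})\bigr]^{2}
```

Variable errors such as sampling variance inflate the first term, while systematic errors such as coverage or nonresponse bias inflate the second; the total survey error perspective trades both off against cost and time.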
Web survey bibliography - Reports, seminars (231)
- Standard Definitions: Final Dispositions of Case Codes and Outcome Rates for Surveys 2016; 2016
- FocusVision 2015 Annual MR Technology Report; 2016; Macer, T., Wilson, S.
- Establishing the accuracy of online panels for survey research; 2016; Bruggen, E.; van den Brakel, J.; Krosnick, J. A.
- Mixing modes of data collection in Swiss social surveys: Methodological report of the LIVES-FORS mixed...; 2016; Roberts, C.; Joye, D.; Staehli, M. E.
- Assessment of Innovations in Data Collection Technology for Understanding Society; 2016; Couper, M. P.
- Report of the Inquiry into the 2015 British general election opinion polls; 2016; Sturgis, P., Baker, N., Callegaro, M., Fisher, St., Green, J., Jennings, W., Kuha, J., Lauderdale, B...
- Evaluating a New Proposal for Detecting Data Falsification in Surveys; 2016; Simmons, K.; Mercer, A. W.; Schwarzer, S.; Courtney, K.
- Computer-assisted and online data collection in general population surveys; 2016; Skarupova, K.
- Predictive inference for non-probability samples: a simulation study; 2016; Buelens, B.; Burger, J.; van den Brakel, J.
- ESOMAR/GRBN Online Research Guideline; 2015
- App vs. Web for Surveys of Smartphone Users: Experimenting with mobile apps for signal-contingent experience...; 2015; McGeeney, K.; Keeter, S.; Igielnik, R.; Smith, A.; Rainie, L.
- On Climbing Stairs Many Steps at a Time: The New Normal in Survey Methodology; 2015; Dillman, D. A.
- Polling Error in the 2015 UK General Election: An Analysis of YouGov’s Pre and Post-Election Polls...; 2015; Wells, A.; Rivers, D.
- GreenBook Research Industry Trends Report; 2015; Murphy, L. (Ed.)
- Standard Definitions: Final Dispositions of Case Codes and Outcome Rates for Surveys 2015; 2015
- Methodology of the RAND Mid-Term 2014 Election Panel; 2015; Carman, K. G; Pollack, S.
- 28 Questions to Help Buyers of Online Samples; 2015; Cape, P. J.; Phillips, A.; Baker, R.; Cooke, M.; Ribeiro, E.; Terhanian, G.
- Understanding Society Innovation Panel Wave 7: Results from Methodological Experiments; 2015; Blom, A. G.; Burton, J.; Booker, C. L.; Cernat, A.; Fairbrother, M.; Jaeckle, A.; Kaminska, O.; Keusch...
- Tips for Creating Web Surveys for Completion on a Mobile Device; 2015; McGeeney, K.
- U.S. Survey Research: Sampling; 2015
- A Comparison of Different Online Sampling Approaches for Generating National Samples; 2014; Heen, M. S. J., Lieberman, J. D., Miethe, T. D.
- FocusVision 2014 Annual MR Technology Report; 2014; Macer, T., Wilson, S.
- The Changing Landscape of Technology and its Effect on Online Survey Data Collection; 2014; Mitchell, N.
- Query on Data Collection for Social Surveys; 2014; Blanke, K., Luiten, A.
- The role of email addresses and email contact in encouraging web response in a mixed mode design; 2014; Cernat, A., Lynn, P.
- Mixed-mode surveys of the general population - Results from the European Social Survey mixed-mode experiment...; 2014; Park, A., Humphrey, A.
- Mixed-Mode Designs bei Erhebungen mit sensitiven Fragen: Einfluss auf das Teilnahme- und Antwortverhalten...; 2014; Krug, G., Kriwy, P., Carstensen, J.
- Methods and systems for managing an online opinion survey service; 2014; Mcloughlin, M. H., Seton, N., Blesy, K.
- Mobile Technologies for Conducting, Augmenting and Potentially Replacing Surveys: Report of the AAPOR...; 2014; Link, M. W., Murphy, J., Schober, M. F., Buskirk, T. D., Childs, J. H., Tesfaye, C.
- The use of within-subject experiments for estimating measurement effects in mixed-mode surveys; 2014; Klausch, L. T., Schouten, B., Hox, J.
- Measuring well-being: An analysis of different response scales; 2014; van Beuningen, J., van der Houwen, K., Moonen, L.
- The impact of contact effort and interviewer performance on mode-specific nonresponse and measurement...; 2014; Schouten, B., Cobben, F., van der Laan, J., Arends, J.
- Community Life Survey: Summary of web experiment findings; 2013
- The Short-term Campaign Panel of the German Longitudinal Election Study 2009. Design, Implementation...; 2013; Steinbrecher, M., Rossmann, J.
- Too Fast, Too Straight, Too Weird: Post Hoc Identification of Meaningless Data in Internet...; 2013; Leiner, D. J.
- Postal recruitment into a longitudinal online panel survey. The effects of different number of reminder...; 2013; Martinsson, J.
- The world in 2013. ICT facts and figures; 2013
- Microsoft Security Intelligence Report, Volume 15; 2013
- A Comparison of Results from a Spanish and English Mail Survey: Effects of Instruction Placement on...; 2013; Wang, K., Sha, M.
- Research Note: Reducing the Threat of Sensitive Questions in Online Surveys?; 2013; Couper, M. P.
- Global market research 2013; 2013
- Exploring the Digital Nation: America’s Emerging Online Experience; 2013
- Advantages of a global multimodal print & digital readership survey; 2013; Cour, N., Saint-Joanis, G.
- Australia: building a 21st century readership survey; 2013; Green, A., White, H.
- The new swiss national readership survey: fit for the future; 2013; Amschler, H., Hoffmann, J.
- ESS Mixed Mode Experiment Results in Estonia (CAWI and CAPI Mode Sequential Design); 2013; Ainsaar, M., Lilleoja, L., Lumiste, K., Roots, A.
- Using smartphones in survey research: a multifunctional tool. Implementation of a time use app; a feasibility...; 2013; Sonck, N., Fernee, H.
- Adaptive survey designs to minimize survey mode effects. A case study on the Dutch Labour Force Survey...; 2013; Calinescu, M., Schouten, B.
- Optimal Resource Allocation in Adaptive Survey Designs; 2013; Calinescu, M.